Search results for "Entropy function"

Showing 8 of 8 documents

Combinatorics of the SU(2) black hole entropy in loop quantum gravity

2009

We use the combinatorial and number-theoretical methods developed in previous works by the authors to study black hole entropy in the new proposal put forth by Engle, Noui, and Perez. Specifically, we give the generating functions relevant for the computation of the entropy and use them to derive its asymptotic behavior, including the value of the Immirzi parameter and the coefficient of the logarithmic correction.
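
For orientation, the quantity obtained from such a state counting is an asymptotic expansion of the microcanonical entropy in the horizon area A (schematic; the specific value of the Immirzi parameter and of the logarithmic coefficient are precisely what the paper derives):

\[
S(A) \;=\; \frac{\gamma_0}{\gamma}\,\frac{A}{4\ell_P^2} \;+\; c\,\ln\frac{A}{\ell_P^2} \;+\; O(1),
\]

where \(\gamma\) is the Immirzi parameter, \(\ell_P\) the Planck length, \(\gamma_0\) the value that reproduces the Bekenstein–Hawking coefficient 1/4, and \(c\) the coefficient of the logarithmic correction.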

Physics; Nuclear and High Energy Physics; Configuration entropy; Immirzi parameter; TheoryofComputation_GENERAL; Loop quantum gravity; Binary entropy function; General Relativity and Quantum Cosmology; Theoretical physics; Classical mechanics; Quantum gravity; Black hole thermodynamics; Entropy (arrow of time); Joint quantum entropy; Physical Review D

Probabilities, States, Statistics

2016

In this chapter we clarify some important notions which are relevant in a statistical theory of heat: The definitions of probability measure, and of thermodynamic states are illustrated, successively, by the classical Maxwell-Boltzmann statistics, by Fermi-Dirac statistics and by Bose-Einstein statistics. We discuss observables and their eigenvalue spectrum as well as entropy and we calculate these quantities for some examples. The chapter closes with a comparison of statistical descriptions of classical and quantum gases.
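
As a minimal numerical companion to the statistics compared in the chapter (not code from the chapter itself; names and units are illustrative, with the Boltzmann constant set to 1), the mean occupation number of a single-particle level and the Gibbs entropy of a discrete distribution can be computed as follows.

```python
import numpy as np

def occupation(eps, mu, T, kind):
    """Mean occupation number of a level with energy eps.

    kind: 'MB' (Maxwell-Boltzmann), 'FD' (Fermi-Dirac), 'BE' (Bose-Einstein).
    Units chosen so that the Boltzmann constant k_B = 1.
    """
    x = (eps - mu) / T
    if kind == "MB":
        return np.exp(-x)               # classical (dilute) limit
    if kind == "FD":
        return 1.0 / (np.exp(x) + 1.0)  # at most one fermion per state
    if kind == "BE":
        return 1.0 / (np.exp(x) - 1.0)  # requires eps > mu
    raise ValueError(kind)

def gibbs_entropy(p):
    """S = -sum_i p_i ln p_i for a discrete probability distribution."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log(p))

if __name__ == "__main__":
    eps, mu, T = 1.0, 0.0, 0.5
    for kind in ("MB", "FD", "BE"):
        print(kind, occupation(eps, mu, T, kind))
    # Entropy is maximal for the uniform distribution: ln(4) ~ 1.386
    print(gibbs_entropy([0.25, 0.25, 0.25, 0.25]))
```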

Condensed Matter::Quantum Gases; Binary entropy function; Entropy (statistical thermodynamics); Statistics; Law of total probability; Observable; Black-body radiation; Statistical theory; Eigenvalues and eigenvectors; Mathematics; Probability measure

Entropy function from toric geometry

2021

It has recently been claimed that a Cardy-like limit of the superconformal index of 4d $\mathcal{N}=4$ SYM accounts for the entropy function, whose Legendre transform corresponds to the entropy of the holographic dual AdS$_5$ rotating black hole. Here we study this Cardy-like limit for $\mathcal{N}=1$ toric quiver gauge theories, observing that the corresponding entropy function can be interpreted in terms of the toric data. Furthermore, for some families of models, we compute the Legendre transform of the entropy function, comparing with similar results recently discussed in the literature.
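
For orientation (schematic; signs, factors, and the precise constraint vary between conventions and are spelled out in the paper): the entropy function \(E(\Delta_I,\omega_i)\) depends on chemical potentials \(\Delta_I\) conjugate to the electric charges \(Q_I\) and \(\omega_i\) conjugate to the angular momenta \(J_i\), and the black hole entropy follows from it by a constrained Legendre transform,

\[
S(Q_I, J_i) \;=\; \underset{\Delta,\omega}{\mathrm{ext}} \left[\, E(\Delta,\omega) + 2\pi i \Big( \textstyle\sum_I \Delta_I Q_I + \sum_i \omega_i J_i \Big) \right],
\]

with the extremization performed subject to a linear constraint among the \(\Delta_I\) and \(\omega_i\). The observation studied here is that, for toric quivers, \(E\) can be read off from the toric data.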

Physics; High Energy Physics - Theory; Nuclear and High Energy Physics; Settore FIS/02 - Fisica Teorica Modelli E Metodi Matematici; Quiver; FOS: Physical sciences; Toric variety; Black hole entropy; Black hole microstates; superconformal index; AdS/CFT; QC770-798; Binary entropy function; Legendre transformation; Entropy (classical thermodynamics); symbols.namesake; High Energy Physics - Theory (hep-th); Rotating black hole; Nuclear and particle physics. Atomic energy. Radioactivity; symbols; Limit (mathematics); Gauge theory; Mathematical physics; Nuclear Physics B

Fractional-order theory of thermoelasticity. I: Generalization of the Fourier equation

2018

The paper deals with the generalization of Fourier-type relations in the context of fractional-order calculus. The instantaneous temperature-flux equation of the Fourier-type diffusion is generalized, introducing a self-similar, fractal-type mass clustering at the micro scale. In this setting, the resulting conduction equation at the macro scale yields a Caputo's fractional derivative with order [0,1] of temperature gradient that generalizes the Fourier conduction equation. The order of the fractional-derivative has been related to the fractal assembly of the microstructure and some preliminary observations about the thermodynamical restrictions of the coefficients and the state functions r…
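
Schematically, a generalization of the type described (with the Caputo derivative in its standard form; the exact constitutive coefficients and their link to the fractal microstructure are the subject of the paper) replaces the classical Fourier law \(q = -\kappa\, \partial T/\partial x\) by a fractional-order relation

\[
q(x,t) \;=\; -\,\kappa_\beta \left({}^{C}\!D_t^{\beta}\, \frac{\partial T}{\partial x}\right)(x,t), \qquad \beta \in [0,1],
\]

where the Caputo fractional derivative of order \(\beta \in (0,1)\) is

\[
\left({}^{C}\!D_t^{\beta} f\right)(t) \;=\; \frac{1}{\Gamma(1-\beta)} \int_0^t \frac{f'(\tau)}{(t-\tau)^{\beta}}\, d\tau,
\]

so that classical, memoryless Fourier conduction is recovered in the limiting value of \(\beta\), while intermediate orders encode a power-law memory of the temperature gradient.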

Uses of trigonometry; Generalization; 01 natural sciences; 010305 fluids & plasmas; Screened Poisson equation; symbols.namesake; Fractional operators; 0103 physical sciences; Fractional Fourier equation; Mechanics of Material; 010306 general physics; Fourier series; Mathematics; Fourier transform on finite groups; Entropy functions; Hill differential equation; Partial differential equation; Mechanical Engineering; Fourier inversion theorem; Mathematical analysis; Temperature evolution; Mechanics of Materials; symbols; Fractional operator; Settore ICAR/08 - Scienza Delle Costruzioni; Entropy function

An algorithmic construction of entropies in higher-order nonlinear PDEs

2006

A new approach to the construction of entropies and entropy productions for a large class of nonlinear evolutionary PDEs of even order in one space dimension is presented. The task of proving entropy dissipation is reformulated as a decision problem for polynomial systems. The method is successfully applied to the porous medium equation, the thin film equation and the quantum drift–diffusion model. In all cases, an infinite number of entropy functionals together with the associated entropy productions is derived. Our technique can be extended to higher-order entropies, containing derivatives of the solution, and to several space dimensions. Furthermore, logarithmic Sobolev inequalities can …
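
A worked example of the kind of statement the algorithm certifies, written out here for the second-order porous medium equation (one of the applications listed above; the cases of interest in the paper are the higher-order equations): for \(u_t = (u^m)_{xx}\), \(m>0\), with periodic or no-flux boundary conditions, the functional \(E_\alpha[u] = \int u^\alpha\, dx\) satisfies

\[
\frac{d}{dt} E_\alpha[u] \;=\; \alpha \int u^{\alpha-1} (u^m)_{xx}\, dx \;=\; -\,\alpha(\alpha-1)\, m \int u^{\alpha+m-3}\, u_x^2\, dx \;\le\; 0
\quad \text{whenever } \alpha(\alpha-1) \ge 0,
\]

so \(E_\alpha\) is an entropy with entropy production \(\alpha(\alpha-1)\, m \int u^{\alpha+m-3} u_x^2\, dx\). Deciding the sign of such integral expressions, which for higher-order equations involve polynomials in several derivatives of \(u\), is what gets reformulated as a decision problem for polynomial systems.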

Partial differential equation; Diffusion equation; Applied Mathematics; Mathematical analysis; General Physics and Astronomy; Statistical and Nonlinear Physics; Strong Subadditivity of Quantum Entropy; Sobolev inequality; Binary entropy function; Nonlinear system; Entropy (energy dispersal); Mathematical Physics; Joint quantum entropy; Mathematics; Nonlinearity

On the Evaluation of Images Complexity: A Fuzzy Approach

2006

The inherently multidimensional problem of evaluating the complexity of an image is of a certain relevance in both computer science and cognitive psychology. Computer scientists usually analyze spatial dimensions, to deal with automatic vision problems, such as feature-extraction. Psychologists seem more interested in the temporal dimension of complexity, to explore attentional models. Is it possible, by merging both approaches, to define a more general index of visual complexity? We have defined a fuzzy mathematical model of visual complexity, using a specific entropy function; results obtained by applying this model to pictorial images have a strong correlation with ones from an experime…
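
The abstract does not name the entropy function used; as an illustration of the general idea, the sketch below treats a normalized grayscale image as a fuzzy set and scores it with the De Luca–Termini fuzzy entropy, one standard choice (the function names and this particular entropy are assumptions, not taken from the paper).

```python
import numpy as np

def fuzzy_entropy(mu):
    """De Luca-Termini fuzzy entropy of a fuzzy membership array mu in [0, 1].

    H = -(1/N) * sum[ mu*log(mu) + (1-mu)*log(1-mu) ]
    H is 0 for a crisp image (all pixels 0 or 1) and maximal when mu = 0.5.
    """
    mu = np.clip(np.asarray(mu, dtype=float), 1e-12, 1.0 - 1e-12)
    h = -(mu * np.log(mu) + (1.0 - mu) * np.log(1.0 - mu))
    return float(h.mean())

def image_complexity(img):
    """Map pixel intensities to memberships in [0, 1], then return the fuzzy entropy."""
    img = np.asarray(img, dtype=float)
    mu = (img - img.min()) / (img.max() - img.min() + 1e-12)
    return fuzzy_entropy(mu)

if __name__ == "__main__":
    crisp = np.zeros((64, 64)); crisp[:, 32:] = 255.0           # sharp two-region image -> low entropy
    noisy = np.random.default_rng(0).uniform(0, 255, (64, 64))  # uniform noise -> high entropy
    print(image_complexity(crisp), image_complexity(noisy))
```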

Correlation; Binary entropy function; business.industry; Computer science; Fuzzy set; Entropy (information theory); Artificial intelligence; business; Machine learning; computer.software_genre; Fuzzy logic; computer; Visual complexity

Extropy: Complementary Dual of Entropy

2015

This article provides a completion to theories of information based on entropy, resolving a longstanding question in its axiomatization as proposed by Shannon and pursued by Jaynes. We show that Shannon's entropy function has a complementary dual function which we call "extropy." The entropy and the extropy of a binary distribution are identical. However, the measure bifurcates into a pair of distinct measures for any quantity that is not merely an event indicator. As with entropy, the maximum extropy distribution is also the uniform distribution, and both measures are invariant with respect to permutations of their mass functions. However, they behave quite differently in their assessments…
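
A quick numerical check of the stated properties, using the definitions \(H(p) = -\sum_i p_i \ln p_i\) and, for the extropy, \(J(p) = -\sum_i (1-p_i)\ln(1-p_i)\) (a minimal sketch, not the paper's code).

```python
import numpy as np

def entropy(p):
    """Shannon entropy H(p) = -sum p_i ln p_i (natural log)."""
    p = np.asarray(p, dtype=float)
    q = p[p > 0]
    return -np.sum(q * np.log(q))

def extropy(p):
    """Extropy J(p) = -sum (1 - p_i) ln(1 - p_i), the complementary dual of H."""
    p = np.asarray(p, dtype=float)
    q = 1.0 - p
    q = q[q > 0]
    return -np.sum(q * np.log(q))

if __name__ == "__main__":
    binary = [0.3, 0.7]
    print(entropy(binary), extropy(binary))    # identical for binary distributions
    ternary = [0.2, 0.3, 0.5]
    print(entropy(ternary), extropy(ternary))  # differ for more than two outcomes
    uniform = [1/3, 1/3, 1/3]
    print(extropy(uniform))                    # maximal extropy among ternary distributions
```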

Bregman divergence; FOS: Computer and information sciences; Statistics and Probability; Settore MAT/06 - Probabilita' E Statistica Matematica; Kullback–Leibler divergence; Computer Science - Information Theory; General Mathematics; FOS: Physical sciences; Binary number; Mathematics - Statistics Theory; Statistics Theory (math.ST); proper scoring rules; Gini index of heterogeneity; Differential entropy; Binary entropy function; FOS: Mathematics; Entropy (information theory); Statistical physics; Dual function; Axiom; Mathematics; Differential and relative entropy/extropy; Settore ING-INF/05 - Sistemi Di Elaborazione Delle Informazioni; Information Theory (cs.IT); Probability (math.PR); repeat rate; Physics - Data Analysis Statistics and Probability; duality; Statistics Probability and Uncertainty; Settore SECS-S/01 - Statistica; Mathematics - Probability; Data Analysis Statistics and Probability (physics.data-an); Statistical Science

Estimating the decomposition of predictive information in multivariate systems

2015

In the study of complex systems from observed multivariate time series, one is often interested in the evolution of a single target system, which can be explained by the information storage of that system and the information transfer from the other interacting systems. We present a framework for the model-free estimation of information storage and information transfer, computed as the terms composing the predictive information about the target of a multivariate dynamical process. The approach tackles the curse of dimensionality by employing a nonuniform embedding scheme that selects progressively, among the past components of the multivariate process, only those that contribute most, in terms of co…
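
In the notation commonly used for this framework (\(Y_n\) the present of the target process, \(Y_n^-\) its past, \(X_n^-\) the past of the other processes; the model-free estimators of these terms are the contribution of the paper), the predictive information decomposes by the chain rule of mutual information as

\[
P_Y \;=\; I(Y_n;\, Y_n^-, X_n^-) \;=\; I(Y_n;\, Y_n^-) \;+\; I(Y_n;\, X_n^- \mid Y_n^-) \;=\; S_Y + T_{X\to Y},
\]

i.e., information storage \(S_Y\) plus information transfer \(T_{X\to Y}\), the conditional mutual information also known as transfer entropy.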

Statistics and Probability; Computer science; Entropy; TRANSFER ENTROPY; Stochastic Processes; Information Storage and Retrieval; heart; APPROXIMATE ENTROPY; Maximum entropy spectral estimation; Information theory; GRANGER CAUSALITY; Joint entropy; Nonlinear Dynamics; MECHANISMS; Binary entropy function; Heart Rate; Models Theoretical; Information; SLEEP EEG; Statistics; OSCILLATIONS; TOOL; Entropy (information theory); Multivariate Analysis; Conditional entropy; HEART-RATE-VARIABILITY; COMPLEXITY; Conditional mutual information; Brain; Electroencephalography; Science General; Condensed Matter Physics; cardiorespiratory; PHYSIOLOGICAL TIME-SERIES; Settore ING-INF/06 - Bioingegneria Elettronica E Informatica; Linear Models; Transfer entropy; Sleep; Algorithm; Statistical and Nonlinear Physics